
Conversation

@ver217 ver217 commented Nov 18, 2021

Add an example of ViT-B/16 and remove w_norm clipping in LAMB.
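
The second part of this PR removes the clipping of the weight norm used when LAMB forms its layer-wise trust ratio. Below is a minimal sketch of that step, assuming a PyTorch-style optimizer; the function and argument names (`lamb_trust_ratio`, `adam_step`) are illustrative and not the actual Colossal-AI implementation:

```python
# Hypothetical sketch of LAMB's trust-ratio computation with w_norm
# clipping removed. Not the hpcaitech/ColossalAI code itself.
import torch

def lamb_trust_ratio(param: torch.Tensor, adam_step: torch.Tensor) -> torch.Tensor:
    """Compute the layer-wise trust ratio r = ||w|| / ||update||.

    With w_norm clipping removed, the raw weight norm is used directly
    rather than being clamped to a fixed range first.
    """
    w_norm = param.norm(p=2)        # no clamp applied to the weight norm
    g_norm = adam_step.norm(p=2)
    if w_norm > 0 and g_norm > 0:
        return w_norm / g_norm
    return torch.ones_like(w_norm)  # fall back to 1.0 if either norm is zero
```

The trust ratio then scales the Adam-style update per layer, so removing the clamp lets layers with large weight norms take proportionally larger steps.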

FrankLeeeee and others added 2 commits November 18, 2021 19:45
* Add gradient accumulation, fix lr scheduler

* fix FP16 optimizer and adapted torch amp with tensor parallel (#18)

* fixed bugs in compatibility between torch amp and tensor parallel and performed some minor fixes

* fixed trainer

* Revert "fixed trainer"

This reverts commit 2e0b0b7.

* improved consistency between trainer, engine and schedule (#23)

Co-authored-by: 1SAA <[email protected]>

Co-authored-by: 1SAA <[email protected]>
Co-authored-by: ver217 <[email protected]>
@ver217 ver217 merged commit f734f4b into hpcaitech:feature/vitb16 Nov 18, 2021
@ver217 ver217 deleted the feature/vitb16 branch November 19, 2021 03:43